123 research outputs found

    Multi-sensory media experiences

    The way we experience the world is based on our five senses, which allow us unique and often surprising sensations of our environment. Interactive technologies mainly stimulate our senses of vision and hearing, partly our sense of touch, while the senses of taste and smell remain largely under-exploited. There is, however, growing international interest from the film, video, and game industries in more immersive viewing and gaming experiences. In the 20th century, the demand for a controllable way to describe colours initiated intense research on colour description that substantially contributed to advances in computer graphics, image processing, photography, and cinematography. Similarly, the 21st century now demands an investigation of touch, taste, and smell as sensory interaction modalities to enhance media experiences.

    Reflection on the design of food systems and experiences for sustainable transformations

    The importance of food and technology in modern society is undeniable. Technological advances have revolutionized how we produce, distribute, and prepare food beyond local boundaries, and even how we eat. Eating is one of the most multisensory experiences in everyday life: all five senses (i.e. taste, smell, vision, hearing, and touch) are involved. We first eat with our eyes, we can smell the food before we taste it, and we then experience its textures and flavours in our mouth. However, the experience does not stop there. The sounds that come both from the environment in which we are immersed while eating and from our interactions with the food (e.g. chewing) and the utensils we use further influence our eating experiences. In all of this, digital technology plays an increasingly important role, especially through emerging immersive technologies such as virtual and augmented reality (VR/AR). Designing at the intersection of technology and food requires multi-stakeholder commitment and a human experience-centred approach. Furthermore, it is essential to look beyond disciplinary boundaries and account for insights on various levels, including perceptual effects, experiential layers, and technological advancements.

    Mid-air haptic rendering of 2D geometric shapes with a dynamic tactile pointer

    An important challenge for ultrasonic mid-air haptics, in contrast to physical touch, is that we lose certain exploratory procedures such as contour following. This makes perceiving geometric properties and identifying shapes more difficult. Meanwhile, the growing interest in mid-air haptics and their application to various new areas requires an improved understanding of how we perceive specific haptic stimuli, such as icons and control dials, in mid-air. We address this challenge by investigating static and dynamic methods of displaying 2D geometric shapes in mid-air. We display a circle, a square, and a triangle, in either a static or a dynamic condition, using ultrasonic mid-air haptics. In the static condition, the shapes are presented as a full outline in mid-air, while in the dynamic condition a tactile pointer is moved around the perimeter of the shapes. We measure participants’ accuracy and confidence in identifying shapes in two controlled experiments (n1 = 34, n2 = 25). Results reveal that in the dynamic condition people recognise shapes significantly more accurately and with higher confidence. We also find that representing polygons as a set of individually drawn haptic strokes, with a short pause at the corners, drastically enhances shape recognition accuracy. Our research supports the design of mid-air haptic user interfaces in application scenarios such as in-car interactions or assistive technology in education.
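    The dynamic condition described above amounts to a focal-point trajectory: the pointer traverses each polygon edge at constant speed and holds briefly at every corner. The following is a minimal illustrative sketch, not the study's implementation; the function name and the speed, update-rate, and pause-duration values are all hypothetical.

```python
import math

def polygon_pointer_path(vertices, speed_m_s=0.2, rate_hz=200, corner_pause_s=0.05):
    """Sample (x, y) focal-point positions for a tactile pointer tracing
    a polygon's perimeter at the device update rate.

    Each corner position is repeated for corner_pause_s seconds, holding
    the focal point still to mark the corner (the cue the abstract reports
    as improving shape recognition).
    """
    path = []
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        length = math.hypot(x1 - x0, y1 - y0)
        # number of update-rate samples needed to cover this edge
        steps = max(1, round(length / speed_m_s * rate_hz))
        for s in range(steps):
            t = s / steps
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        # pause at the corner by holding the focal point in place
        path.extend([(x1, y1)] * round(corner_pause_s * rate_hz))
    return path

# a 4 cm square, drawn as one stroke per edge with a pause at each corner
square = [(0.0, 0.0), (0.04, 0.0), (0.04, 0.04), (0.0, 0.04)]
path = polygon_pointer_path(square)
```

    The resulting list would be streamed to the ultrasound array one position per update tick; splitting the loop body per edge is what turns the outline into individually drawn strokes.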

    I’m sensing in the rain: spatial incongruity in visual-tactile mid-air stimulation can elicit ownership in VR users

    Major virtual reality (VR) companies are trying to enhance the sense of immersion in virtual environments by implementing haptic feedback in their systems (e.g., Oculus Touch). Tactile stimulation is known to add realism to a virtual environment, and when users are not constrained by wearing attachments (e.g., gloves), even more immersive experiences become possible. Mid-air haptic technology provides contactless haptic feedback and thus offers the potential for creating such immersive VR experiences. However, one limitation of mid-air haptics is the need for freehand tracking systems (e.g., Leap Motion) to deliver tactile feedback to the user's hand. These tracking systems are not fully accurate, limiting designers' ability to deliver spatially precise tactile stimulation. Here, we investigated an alternative way to convey incongruent visual-tactile stimulation that can create the illusion of a congruent visual-tactile experience while participants experience the rubber hand illusion in VR.

    Measuring the added value of haptic feedback

    While there is increased appreciation for integrating haptic feedback with audio-visual content, there is still a lack of understanding of how to quantify the added value of touch for a user’s experience (UX) of multimedia content. Here we focus on three main concepts to measure this added value: UX, emotions, and expectations. We present a case study measuring the added value of haptic feedback for a standardized set of audio-visual content (i.e., short video clips), comparing two haptic stimulation modalities (i.e., mid-air vs. vibrotactile stimuli). Our findings demonstrate that haptically enhanced audio-visual content is perceived as a more pleasant, unpredictable, and creative experience. Users’ overall liking increases together with a positive change in their expectations, independently of the haptic stimulation modality. We discuss how our approach provides the foundation for future work on developing a measurement model to predict the added value of haptic feedback for users’ experiences within and beyond the multimedia context.

    Creating an illusion of movement between the hands using mid-air touch

    Apparent tactile motion (ATM) has been shown to occur across many contiguous parts of the body, such as the fingers, forearms, and back. More recently, the illusion has also been elicited on non-contiguous parts of the body, such as from one hand to the other, whether or not the hands are interconnected by an object placed between them. Here we explore the reproducibility of this intermanual tactile illusion of movement between two free hands using mid-air tactile stimulation. We investigate the optimal parameters for generating continuous and smooth motion using two arrays of ultrasound speakers and two stimulation techniques (i.e. a static vs. a dynamic focal point). In the first experiment, we investigate the occurrence of the illusion when using a static focal point, and we define a perceptive model. In the second experiment, we examine the illusion using a dynamic focal point, defining a second perceptive model. Finally, we discuss the differences between the two techniques.
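    The two-stimulus timing at the heart of apparent tactile motion can be illustrated with a toy scheduler: the stimulus on the second hand begins before the stimulus on the first hand ends, with the stimulus onset asynchrony (SOA) derived from the stimulus duration. This sketches the general ATM timing principle only, not the perceptive models fitted in the experiments; the function name and the overlap fraction are assumptions for illustration.

```python
def atm_schedule(duration_ms, overlap_fraction=0.5):
    """Return (onset, offset) times in ms for two tactile stimuli,
    one per hand, timed so that they overlap.

    The second stimulus starts after a stimulus onset asynchrony (SOA)
    shorter than the first stimulus's duration -- the overlapping regime
    in which apparent motion between the two sites is typically reported.
    (overlap_fraction is an illustrative assumption, not a fitted value.)
    """
    soa_ms = duration_ms * (1.0 - overlap_fraction)
    first = (0.0, float(duration_ms))          # stimulus on hand 1
    second = (soa_ms, soa_ms + duration_ms)    # stimulus on hand 2
    return first, second

# e.g. two 400 ms stimuli with 50% overlap: the second hand's
# stimulus starts at 200 ms, while the first is still active
first, second = atm_schedule(400)
```

    In a static-focal-point setup each interval would drive one array's fixed focal point; in a dynamic setup the focal point would additionally travel during each interval.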

    "Touch me": workshop on tactile user experience evaluation methods

    In this workshop we plan to explore the possibilities and challenges of using physical objects and materials to evaluate the User Experience (UX) of interactive systems. These objects should address shortfalls of current UX evaluation methods and allow for a qualitative (or even quantitative), playful, and holistic evaluation of UX -- without interfering with users' personal experiences during interaction. This adds a tactile dimension to the solely visual stimulation used in classical evaluation methods. The workshop serves as a basis for networking and community building among interested HCI researchers, designers, and practitioners, and should encourage further development of the field of tactile UX evaluation.